Event detection without trigger words incorporating syntactic information
Cui WANG, Yafei ZHANG, Junjun GUO, Shengxiang GAO, Zhengtao YU
Journal of Computer Applications, 2021, 41(12): 3534-3539. DOI: 10.11772/j.issn.1001-9081.2021060928
Abstract

Event Detection (ED) is one of the most important tasks in information extraction, aiming to identify instances of specific event types in text. Existing ED methods usually represent syntactic dependencies with an adjacency matrix, which then has to be encoded by a Graph Convolutional Network (GCN) to extract the syntactic information, increasing the complexity of the model. Therefore, an event detection method without trigger words incorporating syntactic information was proposed. After the dependency head word and its context were converted into a position marker vector, the word embedding of the dependency child word was fused at the source end of the model in a parameter-free manner to strengthen the semantic representation of the context, so that no GCN encoding was required. In addition, since labeling trigger words is time-consuming and laborious, a type-aware perceptron based on the multi-head attention mechanism was designed to model the potential trigger words in a sentence and complete event detection without trigger words. To verify the performance of the proposed method, experiments were conducted on the ACE2005 dataset and a low-resource Vietnamese dataset. Compared with the Event Detection Using Graph Transformer Network (GTN-ED) method, the proposed method improved the F1-score by 3.7% on the ACE2005 dataset; compared with the binary classification method Type-aware Bias Neural Network with Attention Mechanisms (TBNNAM), it improved the F1-score by 9% on the Vietnamese dataset. The results show that integrating syntactic information into the Transformer can effectively connect the scattered event information in a sentence and improve the accuracy of event detection.
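The following is a minimal PyTorch sketch of the two ideas summarized above: summing a dependency position-marker representation into the token embeddings at the model input (so that no separate GCN encoding pass is needed) and letting one attention query per event type act as a soft, latent trigger detector. All names, shapes, and layer choices (TriggerFreeEventDetector, dep_pos_emb, type_queries, the two-layer encoder, the plain-addition fusion) are illustrative assumptions for exposition, not the authors' implementation.

import torch
import torch.nn as nn


class TriggerFreeEventDetector(nn.Module):
    """Fuses dependency position markers with word embeddings (no GCN),
    then scores latent triggers per event type with multi-head attention."""

    def __init__(self, vocab_size, num_positions, num_event_types,
                 d_model=256, n_heads=8):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, d_model)
        # Embedding of the position marker derived from the dependency head
        # and its context; it is added to the token embedding by a simple
        # sum, so the fusion step itself introduces no extra parameters.
        self.dep_pos_emb = nn.Embedding(num_positions, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=2)
        # One learned query per event type: its attention over the tokens
        # models potential trigger words without trigger annotations.
        self.type_queries = nn.Parameter(torch.randn(num_event_types, d_model))
        self.type_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.classifier = nn.Linear(d_model, 1)  # per-type binary score

    def forward(self, token_ids, dep_pos_ids):
        # token_ids, dep_pos_ids: (batch, seq_len)
        x = self.word_emb(token_ids) + self.dep_pos_emb(dep_pos_ids)
        h = self.encoder(x)                                      # (B, L, d)
        q = self.type_queries.unsqueeze(0).expand(h.size(0), -1, -1)
        # Each event-type query attends to the sentence; the attention
        # weights play the role of soft, unlabeled trigger positions.
        ctx, _ = self.type_attn(q, h, h)                         # (B, T, d)
        return torch.sigmoid(self.classifier(ctx)).squeeze(-1)   # (B, T)

In this sketch only the fusion is parameter-free (a plain addition at the input, analogous to positional encodings), while the marker representation itself is a learned table; the per-type attention scores replace explicit trigger labels, which is the assumption behind the "type perceptron" described in the abstract.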
